High Performance Associative Memory Models and Symmetric Connections

Authors

  • N. Davey
  • R. G. Adams
  • S. P. Hunt
Abstract

Two existing high capacity training rules for the standard Hopfield architecture associative memory are examined. Both rules, based on the perceptron learning rule, produce asymmetric weight matrices, for which the simple dynamics (only point attractors) of a symmetric network can no longer be guaranteed. This paper examines the consequences of imposing a symmetry constraint in learning. The mean size of the attractor basins of trained patterns and the mean time for learning convergence are analysed for the networks that arise from these learning rules, in both their asymmetric and symmetric instantiations. It is concluded that a symmetry constraint does not have any adverse effect on performance, but that it does offer benefits in learning time and in network dynamics.
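The sketch below is a minimal illustration of the kind of perceptron-style training the abstract refers to, with the symmetry constraint applied by mirroring each weight update. The stability threshold T, the learning rate, and the convergence loop are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def train_perceptron_am(patterns, T=1.0, lr=0.1, symmetric=False, max_epochs=500):
    """Perceptron-style training for a Hopfield-type associative memory.

    patterns  : array of shape (P, N) with entries in {-1, +1}
    T         : stability margin each local field must exceed (illustrative value)
    symmetric : if True, every weight change is mirrored so that W stays symmetric
    Returns the weight matrix and the number of epochs taken to converge.
    """
    P, N = patterns.shape
    W = np.zeros((N, N))
    for epoch in range(max_epochs):
        converged = True
        for xi in patterns:
            h = W @ xi                      # local fields for this pattern
            unstable = xi * h < T           # units not yet stable with margin T
            if not unstable.any():
                continue
            converged = False
            for i in np.where(unstable)[0]:
                dw = (lr / N) * xi[i] * xi  # perceptron-style update for row i
                dw[i] = 0.0                 # no self-connection
                W[i, :] += dw
                if symmetric:               # symmetry constraint: mirror the change
                    W[:, i] += dw
        if converged:
            return W, epoch
    return W, max_epochs

# Example: store 20 random bipolar patterns in a 100-unit network
rng = np.random.default_rng(0)
pats = rng.choice([-1, 1], size=(20, 100))
W_sym, epochs = train_perceptron_am(pats, symmetric=True)
```

With symmetric=True the mirrored updates keep W symmetric throughout training, which is the condition under which only point attractors arise in the retrieval dynamics.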

Similar Articles

Efficient architectures for sparsely-connected high capacity associative memory models

In physical implementations of associative memory, wiring costs play a significant role in shaping patterns of connectivity. In this study of sparsely-connected associative memory, a range of architectures is explored in search of optimal connection strategies which maximise pattern-completion performance, while at the same time minimising wiring costs. It is found that architectures in which t...

Full text

High Capacity, Small World Associative Memory Models (Neil Davey, Lee Calcraft and Rod Adams)

Models of associative memory usually have full connectivity or, if diluted, random symmetric connectivity. In contrast, biological neural systems have predominantly local, non-symmetric connectivity. Here we investigate sparse networks of threshold units, trained with the perceptron learning rule. The units are given position and are arranged in a ring (a rough connectivity sketch follows this entry). The connectivity graph varies between bein...

Full text
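A rough illustration of the connectivity described in that entry, assuming a Watts-Strogatz-style rewiring parameter p. The neighbourhood size k and the rewiring scheme are illustrative, not necessarily the construction used in that paper.

```python
import numpy as np

def ring_connectivity(N, k, p, rng=None):
    """Boolean connection matrix for N units placed on a ring.

    Each unit starts with directed connections to its k nearest neighbours on
    either side; each connection is then rewired to a random target with
    probability p, so p = 0 gives purely local wiring, p = 1 gives random
    wiring, and intermediate p gives small-world-like graphs.
    """
    rng = np.random.default_rng() if rng is None else rng
    C = np.zeros((N, N), dtype=bool)
    for i in range(N):
        for offset in range(1, k + 1):
            for j in ((i + offset) % N, (i - offset) % N):
                if rng.random() < p:        # rewire this local connection
                    free = [t for t in range(N) if t != i and not C[i, t]]
                    if free:
                        j = int(rng.choice(free))
                C[i, j] = True
    return C
```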

Pattern Recognition in Neural Networks with Competing Dynamics: Coexistence of Fixed-Point and Cyclic Attractors

We study the properties of the dynamical phase transition occurring in neural network models in which a competition between associative memory and sequential pattern recognition exists. This competition occurs through a weighted mixture of the symmetric and asymmetric parts of the synaptic matrix (a minimal sketch of this decomposition follows this entry). Through a generating functional formalism, we determine the structure of the parameter space at no...

Full text
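A minimal sketch of the weighted mixture mentioned in that entry, using the standard decomposition of a square matrix into its symmetric and antisymmetric parts; the name and range of the mixing parameter eps are assumptions for illustration.

```python
import numpy as np

def mixed_synaptic_matrix(W, eps):
    """Weighted mixture of the symmetric and antisymmetric parts of W.

    eps = 0 keeps only the symmetric part (fixed-point dynamics guaranteed);
    eps = 1 keeps only the antisymmetric part; intermediate values interpolate
    between the two regimes.
    """
    W_sym = 0.5 * (W + W.T)     # symmetric part
    W_asym = 0.5 * (W - W.T)    # antisymmetric part
    return (1.0 - eps) * W_sym + eps * W_asym
```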

Efficient associative memory using small-world architecture

Most models of neural associative memory have used networks with broad connectivity. However, from both a neurobiological viewpoint and an implementation perspective, it is logical to minimize the length of inter-neural connections and consider networks whose connectivity is predominantly local. The 'small-world networks' model described recently by Watts and Strogatz provides an interesting ap...

Full text

High Performance Associative Memory and Weight Dilution

The consequences of diluting the weights of the standard Hopfield architecture associative memory model, trained using perceptron-like learning rules, are examined. A proportion of the weights of the network are removed; this can be done in a symmetric or an asymmetric way, and both methods are investigated (a dilution sketch follows this entry). This paper reports experimental investigations into the consequences of dilution in terms o...

Full text
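A short sketch of the two dilution schemes that entry describes: removing reciprocal weight pairs together (symmetric dilution) or removing directed weights independently (asymmetric dilution). The function name and interface are illustrative assumptions.

```python
import numpy as np

def dilute_weights(W, d, symmetric=True, rng=None):
    """Remove a proportion d of the off-diagonal weights of W.

    symmetric=True removes w[i, j] and w[j, i] together, preserving symmetry;
    symmetric=False removes individual directed weights independently.
    """
    rng = np.random.default_rng() if rng is None else rng
    N = W.shape[0]
    W = W.copy()
    if symmetric:
        i, j = np.triu_indices(N, k=1)       # each unordered pair exactly once
        cut = rng.random(i.size) < d
        W[i[cut], j[cut]] = 0.0
        W[j[cut], i[cut]] = 0.0
    else:
        mask = rng.random((N, N)) < d        # each directed weight independently
        np.fill_diagonal(mask, False)
        W[mask] = 0.0
    return W
```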

Journal:

Volume   Issue

Pages   -

Publication date: 2000